New l1-Norm Relaxations and Optimizations for Graph Clustering

Authors

  • Feiping Nie
  • Hua Wang
  • Cheng Deng
  • Xinbo Gao
  • Xuelong Li
  • Heng Huang
Abstract

In recent data mining research, graph clustering methods such as normalized cut and ratio cut have been well studied and applied to many unsupervised learning applications. The original graph clustering problems are NP-hard. Traditional approaches use spectral relaxation to solve them, but the obtained spectral solutions can severely deviate from the true solution. To address this problem, we propose a new relaxation mechanism for graph clustering methods in this paper. Instead of minimizing the squared distances of the clustering results, we use the ℓ1-norm distance. More importantly, to keep the normalization consistent, we also use the ℓ1-norm for the normalization terms in the new graph clustering relaxations. Because ℓ1-norm minimization induces sparsity, the solutions of our new relaxed graph clustering methods take discrete values with many zeros, which are close to the ideal solutions. Our new objectives are difficult to optimize, because the minimization involves a ratio of non-smooth terms, and existing sparse learning optimization algorithms cannot be applied. We therefore propose a new optimization algorithm to solve this non-smooth ratio minimization problem. Extensive experiments on three two-way clustering and eight multi-way clustering benchmark data sets show that our new relaxation methods consistently improve the normalized cut and ratio cut clustering results.

Introduction

Clustering is an important task in computer vision and machine learning research, with many applications such as image segmentation (Shi and Malik 2000), image categorization (Grauman and Darrell 2006), scene analysis (Koppal and Narasimhan 2006), motion modeling (Ochs and Brox 2012), and medical image analysis (Brun, Park, and Shenton 2004). Many clustering algorithms have been proposed in the past decades. Among them, graph clustering methods that exploit manifold information have shown state-of-the-art clustering performance. Graph-based clustering methods model the data as a weighted undirected graph built from pairwise similarities; clustering is then accomplished by finding the best cuts of the graph that optimize predefined cost functions. Two graph clustering criteria, normalized cut (Shi and Malik 2000) and ratio cut (Cheng and Wei 1991; Hagen and Kahng 1992), are widely used because of their good clustering performance.

Solving the graph clustering problem exactly is difficult: the problems are NP-hard, and the main difficulty comes from the discrete constraints on the solution. Approximate solutions are possible with spectral relaxations, which typically reduce the optimization to computing the top eigenvectors of certain graph affinity matrices; the clustering result is then derived from the obtained eigen-space. However, the traditional spectral relaxations lead to non-optimal clustering results.
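For readers who want the formulas behind this discussion, below is a brief recap of the standard two-way objectives and their classical spectral relaxation. It follows the usual formulation (e.g., Shi and Malik 2000) rather than quoting the paper; W denotes the affinity matrix with entries w_ij, d_i = sum_j w_ij the node degrees, D = diag(d_1, ..., d_n), and L = D - W the graph Laplacian.

```latex
% Two-way cuts for a partition (A, \bar{A}) of the vertex set, with
% cut(A,\bar{A}) = \sum_{i \in A, j \in \bar{A}} w_{ij} and vol(A) = \sum_{i \in A} d_i:
\mathrm{Rcut}(A,\bar{A}) = \mathrm{cut}(A,\bar{A})\Big(\tfrac{1}{|A|} + \tfrac{1}{|\bar{A}|}\Big),
\qquad
\mathrm{Ncut}(A,\bar{A}) = \mathrm{cut}(A,\bar{A})\Big(\tfrac{1}{\mathrm{vol}(A)} + \tfrac{1}{\mathrm{vol}(\bar{A})}\Big).

% Classical spectral relaxation: drop the discreteness constraint on the
% indicator vector y and minimize a Rayleigh quotient instead,
\min_{y \perp \mathbf{1},\, y \neq 0} \frac{y^{\top} L y}{y^{\top} y} \quad (\text{ratio cut}),
\qquad
\min_{y \perp D\mathbf{1},\, y \neq 0} \frac{y^{\top} L y}{y^{\top} D y} \quad (\text{normalized cut}),
% solved by the second-smallest eigenvectors of L and of the pencil (L, D).
```

The same classical pipeline, relaxation followed by thresholding post-processing, can be sketched in a few lines of Python. This is only an illustration of the baseline the authors criticize, not their method; the toy affinity matrix and the sign threshold are assumptions made for this example.

```python
# Minimal sketch of the classical spectral relaxation of two-way normalized cut,
# followed by the thresholding post-processing discussed in the text.
# The toy graph and the sign threshold are illustrative assumptions.
import numpy as np
from scipy.linalg import eigh

# Toy symmetric affinity matrix: two blocks joined by one weak edge.
W = np.array([
    [0.0, 1.0, 1.0, 0.1, 0.0, 0.0],
    [1.0, 0.0, 1.0, 0.0, 0.0, 0.0],
    [1.0, 1.0, 0.0, 0.0, 0.0, 0.0],
    [0.1, 0.0, 0.0, 0.0, 1.0, 1.0],
    [0.0, 0.0, 0.0, 1.0, 0.0, 1.0],
    [0.0, 0.0, 0.0, 1.0, 1.0, 0.0],
])

d = W.sum(axis=1)          # node degrees
D = np.diag(d)             # degree matrix
L = D - W                  # unnormalized graph Laplacian

# Spectral relaxation of Ncut: generalized eigenproblem L y = lambda D y.
# Eigenvalues are returned in ascending order; the second eigenvector is the
# relaxed (continuous) cluster indicator.
_, eigvecs = eigh(L, D)
y = eigvecs[:, 1]

# Thresholding post-processing turns the continuous indicator into labels;
# this discretization step is where the relaxed solution can drift from the
# ideal discrete one.
labels = (y > 0).astype(int)
print("relaxed indicator:", np.round(y, 3))
print("labels after thresholding:", labels)
```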
The spectral solutions do not directly provide the clustering results, and a thresholding post-processing step has to be applied, so the results often severely deviate from the true solution. More recently, tight relaxations of balanced graph clustering methods were proposed (Bühler and Hein 2009; Luo et al. 2010; Hein and Setzer 2011), and gradient-based methods were used to solve them, which are time-consuming and slow to converge in practice.

To address these challenges, in this paper we revisit the normalized cut and ratio cut methods and propose new relaxations that achieve discrete and sparse clustering results close to the ideal solutions. Instead of minimizing the projected squared distance of the clustering indicators, we minimize the ℓ1 distance. Meanwhile, our new relaxations also use the ℓ1-norm for the normalization terms. Due to the ℓ1-norm minimization, most elements of each clustering indicator are driven to zero, and hence the clustering results are close to the ideal solutions. However, our new relaxations introduce a difficult optimization problem: the minimization of a ratio of two non-smooth terms. The standard optimization methods for sparse learning, such as proximal gradient and iterative shrinkage methods, …
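The excerpt does not state the new objective explicitly, but its description, ℓ1 distances between indicator entries in the numerator and ℓ1 normalization terms in the denominator, suggests the following schematic two-way contrast. This is an illustrative reading of the abstract, not the paper's exact formulation.

```latex
% Schematic two-way contrast (an assumption based on the abstract's description,
% not the paper's exact objective). Spectral relaxations minimize a ratio of
% squared terms; the proposed relaxations replace both with \ell_1 counterparts:
\text{spectral:}\quad \min_{y}\;
\frac{\sum_{i<j} w_{ij}\,(y_i - y_j)^2}{\sum_i d_i\, y_i^2}
\qquad\longrightarrow\qquad
\ell_1\ \text{relaxation:}\quad \min_{y}\;
\frac{\sum_{i<j} w_{ij}\,\lvert y_i - y_j\rvert}{\sum_i d_i\,\lvert y_i\rvert}.
```

In this form both the numerator and the denominator are non-smooth, which is precisely the ratio-of-non-smooth-terms minimization that, as the authors note, standard sparse-learning solvers cannot handle directly.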

Related articles

Learning Robust Graph Regularisation for Subspace Clustering

Various subspace clustering methods have benefited from introducing a graph regularisation term in their objective functions. In this work, we identify two critical limitations of the graph regularisation term employed in existing subspace clustering models and provide solutions for both of them. First, the squared l2-norm used in the existing term is replaced by an l1-norm term to make the regu...


Regularized l1-Graph for Data Clustering

l1-Graph has been proven to be effective in data clustering, which partitions the data space by using the sparse representation of the data as the similarity measure. However, the sparse representation is performed for each datum independently without taking into account the geometric structure of the data. Motivated by l1-Graph and manifold learning, we propose Regularized l1-Graph (Rl1-Graph) ...


Learning With ℓ1-Graph for Image Analysis

The graph construction procedure essentially determines the potentials of those graph-oriented learning algorithms for image analysis. In this paper, we propose a process to build the so-called directed l1-graph, in which the vertices involve all the samples and the ingoing edge weights to each vertex describe its l1-norm driven reconstruction from the remaining samples and the noise. Then, a s...


Convex Relaxations of Bregman Divergence Clustering

Although many convex relaxations of clustering have been proposed in the past decade, current formulations remain restricted to spherical Gaussian or discriminative models and are susceptible to imbalanced clusters. To address these shortcomings, we propose a new class of convex relaxations that can be flexibly applied to more general forms of Bregman divergence clustering. By basing these new ...


The Constrained Laplacian Rank Algorithm for Graph-Based Clustering

Graph-based clustering methods perform clustering on a fixed input data graph. If this initial construction is of low quality then the resulting clustering may also be of low quality. Moreover, existing graph-based clustering methods require post-processing on the data graph to extract the clustering indicators. We address both of these drawbacks by allowing the data graph itself to be adjusted...



Published in: Proceedings of the Thirtieth AAAI Conference on Artificial Intelligence (AAAI-16)

Publication year: 2016